A new conjugate gradient method based on a modified secant condition with its applications in image processing

Abstract

We propose an effective conjugate gradient method, belonging to the class of Dai–Liao methods, for solving unconstrained optimization problems. We employ a variant of a modified secant condition and introduce a new parameter determined by an auxiliary problem. This problem combines well-known features of linear conjugate gradient methods using penalty functions, and thus takes advantage of function-value information as well as gradient information over the iterations. The proposed method is globally convergent under mild assumptions. We examine its ability to solve real-world problems from the image processing field; numerical results show that it is efficient in the sense of the PSNR test. We also compare our method with existing algorithms on the CUTEr test collection to demonstrate its efficiency.
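The Dai–Liao family mentioned in the abstract shares a common direction update, d_{k+1} = −g_{k+1} + β_k d_k with β_k = g_{k+1}^T(y_k − t·s_k) / (d_k^T y_k). As a minimal sketch (not the paper's specific parameter choice or line search), here is that classical Dai–Liao update applied to a convex quadratic, where the exact line search is available in closed form:

```python
import numpy as np

def dai_liao_cg_quadratic(A, b, x0, t=0.1, tol=1e-10, max_iter=100):
    # Sketch of a Dai-Liao-type CG iteration on the quadratic
    # f(x) = 0.5 x^T A x - b^T x, whose gradient is g = A x - b.
    x = x0.astype(float).copy()
    g = A @ x - b
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)          # exact minimizer along d
        x_new = x + alpha * d
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        denom = d @ y
        # Classical Dai-Liao parameter: beta = g_new^T (y - t*s) / (d^T y)
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-14 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

With an exact line search on a quadratic, the g_new^T s term vanishes and the update reduces to linear CG, so the iterate reaches the minimizer of f in at most n steps; the paper's contribution lies in how β and the secant condition are modified for the general nonlinear case.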


Similar articles

Two new conjugate gradient methods based on modified secant equations

Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and Zhang and Xu, and the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is their acco...


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of the line search method, based on eigenvalue analysis. The globa...


A conjugate gradient based method for Decision Neural Network training

Decision Neural Network is a new approach for solving multi-objective decision-making problems based on artificial neural networks. By using inexact evaluation data, network training is improved and the number of required training data sets is decreased. The available training method is based on the gradient descent method (BP). One of its limitations is related to its convergence speed. Therefore,...


Accelerated conjugate gradient algorithm with modified secant condition for unconstrained optimization

Conjugate gradient algorithms are very powerful methods for solving large-scale unconstrained optimization problems, characterized by low memory requirements and strong local and global convergence properties. Over 25 variants of different conjugate gradient methods are known. In this paper we propose a fundamentally different method, in which the well-known parameter β_k is computed by an appro...


New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization

Based on the secant condition often satisfied by quasi-Newton methods, two new versions of the Hestenes-Stiefel (HS) nonlinear conjugate gradient method are proposed, which are descent methods even with inexact line searches. The search directions of the proposed methods have the form d_k = −θ_k g_k + β_k^HS d_{k−1}, or d_k = −g_k + β_k^HS d_{k−1} + θ_k y_{k−1}. When exact line searches are used, the propos...
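The two direction forms quoted above can be sketched numerically. This is an illustrative snippet with hypothetical variable names (θ is the scaling parameter of the quoted methods; its update rule is not given in the excerpt), using the standard HS parameter β_k^HS = g_k^T y_{k−1} / (d_{k−1}^T y_{k−1}):

```python
import numpy as np

def hs_directions(g, g_prev, d_prev, theta):
    # Hestenes-Stiefel parameter: beta_HS = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1})
    y = g - g_prev                                    # gradient difference y_{k-1}
    beta_hs = (g @ y) / (d_prev @ y)
    # The two modified direction forms quoted in the abstract:
    d_scaled = -theta * g + beta_hs * d_prev          # d_k = -theta_k g_k + beta_k^HS d_{k-1}
    d_three_term = -g + beta_hs * d_prev + theta * y  # d_k = -g_k + beta_k^HS d_{k-1} + theta_k y_{k-1}
    return d_scaled, d_three_term
```

The descent property claimed in the abstract comes from choosing θ_k appropriately; with θ = 1 both forms reduce to HS-like updates.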



Journal

Journal title: RAIRO - Operations Research

Year: 2021

ISSN: 1290-3868, 0399-0559

DOI: https://doi.org/10.1051/ro/2020145